Development and Application of Fault Detectability Performance Metrics for Instrument Calibration Verification and Anomaly Detection
Authors
Abstract
Traditionally, the calibration of safety-critical nuclear instrumentation has been performed during each refueling outage. However, many nuclear plants have moved toward condition-directed rather than time-directed calibration. Condition-directed calibration is accomplished through on-line monitoring (OLM), which commonly uses an autoassociative empirical modeling architecture to assess instrument channel performance. An autoassociative architecture predicts a group of correct sensor values when supplied a group of sensor values that is usually corrupted with process and instrument noise and may also contain faults such as sensor drift or complete failure. This paper describes one such autoassociative model architecture, autoassociative kernel regression (AAKR), and presents five metrics that may be used to evaluate its performance. These metrics include the previously developed accuracy, auto sensitivity, and cross sensitivity metrics, along with two new fault detectability performance metrics for application to instrument calibration verification (ICV) and anomaly detection. These metrics are calculated for an AAKR model of an operating nuclear power plant steam system and are used to describe the effects of model architecture on performance. It is shown that the ability of an empirical model to detect sensor faults in ICV systems depends largely on the model uncertainty and, to a lesser degree, on its auto sensitivity. It is also shown that the ability of an empirical model to detect anomalies via the Sequential Probability Ratio Test (SPRT) is likewise related to uncertainty, and that the SPRT detectability is on the order of 50% smaller than the ICV detectability. These guidelines provide a framework for model development, in that models intended for ICV and anomaly detection tasks should focus on minimizing uncertainty. Furthermore, the ICV and anomaly detection performance metrics are shown to fall within the traditional +/-1% calibration tolerance, and their performance under artificially faulted conditions is shown to be in direct agreement with their theoretical foundations.
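As a concrete illustration of the modeling and detection steps summarized above, the following is a minimal sketch of an AAKR prediction with a Gaussian kernel followed by a Wald SPRT applied to the residuals. The memory matrix, kernel bandwidth, injected drift size, and SPRT error rates are illustrative assumptions, not values taken from the paper.

```python
# Minimal AAKR + SPRT sketch (assumed parameter values, not from the paper).
import numpy as np

def aakr_predict(X_mem, x_query, h=0.5):
    """Predict correct sensor values for one (possibly faulted) observation.

    X_mem   : (n_mem, n_sensors) matrix of fault-free historical observations
    x_query : (n_sensors,) current observation
    h       : Gaussian kernel bandwidth (assumed value)
    """
    d2 = np.sum((X_mem - x_query) ** 2, axis=1)   # squared distances to memory vectors
    w = np.exp(-d2 / (2.0 * h ** 2))              # Gaussian kernel weights
    w /= (w.sum() + 1e-12)                        # normalize (guard against all-zero weights)
    return w @ X_mem                              # weighted average of memory vectors

def sprt_mean_shift(residuals, sigma, m, alpha=0.01, beta=0.10):
    """Wald SPRT for a positive mean shift of size m in Gaussian residuals.

    Returns +1 (fault), -1 (healthy), or 0 (undecided) for a residual sequence.
    """
    upper = np.log((1.0 - beta) / alpha)          # accept H1 (faulted)
    lower = np.log(beta / (1.0 - alpha))          # accept H0 (healthy)
    llr = 0.0
    for r in residuals:
        llr += (m / sigma ** 2) * (r - m / 2.0)   # log-likelihood ratio increment
        if llr >= upper:
            return +1
        if llr <= lower:
            return -1
    return 0

# Toy usage: two correlated "sensors" with a small drift injected on sensor 0.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X_mem = np.hstack([base, 0.8 * base]) + 0.05 * rng.normal(size=(200, 2))
x_new = np.array([1.0, 0.8]) + np.array([0.03, 0.0])   # 3% drift on sensor 0
x_hat = aakr_predict(X_mem, x_new)
print("prediction:", x_hat, "residual:", x_new - x_hat)

drifted = 0.03 + 0.05 * rng.normal(size=50)             # residual sequence with a drift
print("SPRT decision:", sprt_mean_shift(drifted, sigma=0.05, m=0.03))
```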
Similar Articles
Fault Strike Detection Using Satellite Gravity Data Decomposition by Discrete Wavelets: A Case Study from Iran
Estimating the boundary of gravity anomaly causative bodies can facilitate the interpretation of the gravity field. In this paper, the 2D discrete wavelet transform (DWT) is employed as a method to delineate the boundary of gravity anomaly sources. Hence, the GRACE satellite gravity data are decomposed using the DWT. The DWT decomposes a single set of approximation coefficients into four distinct components: the appr...
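As an illustration of the decomposition step this abstract describes, the following is a minimal sketch of a single-level 2D DWT applied to a gridded gravity-anomaly map, assuming the PyWavelets package is available; the synthetic grid and the 'db2' wavelet are illustrative choices rather than those of the study.

```python
# Single-level 2D DWT of a synthetic gravity-anomaly grid (illustrative only).
import numpy as np
import pywt

# Synthetic gridded anomaly: a smooth regional field plus a sharp local edge.
x, y = np.meshgrid(np.linspace(-1, 1, 128), np.linspace(-1, 1, 128))
grav = np.exp(-(x ** 2 + y ** 2)) + 0.3 * (x > 0.2)

# One decomposition level: approximation plus horizontal/vertical/diagonal details.
cA, (cH, cV, cD) = pywt.dwt2(grav, 'db2')

# The detail coefficients concentrate energy along abrupt changes, which is why
# they can be used to delineate source boundaries such as fault strikes.
edge_strength = np.sqrt(cH ** 2 + cV ** 2 + cD ** 2)
print(cA.shape, edge_strength.shape)
```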
Probabilistic Soft Error Detection Based on Anomaly Speculation
Microprocessors are becoming increasingly vulnerable to soft errors due to the current trends of semiconductor technology scaling. Traditional redundant multithreading architectures provide perfect fault tolerance by re-executing all the computations. However, such a full re-execution technique significantly increases the verification workload on the processor resources, resulting in severe per...
Improving the RX Anomaly Detection Algorithm for Hyperspectral Images using FFT
Anomaly Detection (AD) has recently become an important application of target detection in hyperspectral images. The Reed-Xiaoli (RX) detector is the most widely used AD algorithm, but it suffers from the "small sample size" problem. The best solution for this problem is to use Dimensionality Reduction (DR) techniques as a pre-processing step for the RX detector. Using this method not only improves the detection p...
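For context, the following is a minimal sketch of the global RX detector mentioned above, computed as the Mahalanobis distance of each pixel from the scene background; the synthetic cube and the small covariance regularization term are assumptions made purely for illustration.

```python
# Global Reed-Xiaoli (RX) anomaly detector sketch (synthetic data, illustrative only).
import numpy as np

def rx_detector(cube):
    """Global RX: Mahalanobis distance of each pixel from the scene background.

    cube : (rows, cols, bands) hyperspectral image
    returns a (rows, cols) anomaly score map
    """
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    mu = X.mean(axis=0)
    # Slight regularization; poor small-sample covariance estimates are exactly
    # the weakness that pre-processing steps such as DR or FFT aim to mitigate.
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(bands)
    cov_inv = np.linalg.inv(cov)
    Z = X - mu
    scores = np.einsum('ij,jk,ik->i', Z, cov_inv, Z)   # per-pixel Mahalanobis distance
    return scores.reshape(rows, cols)

# Toy usage: background pixels plus one injected spectral anomaly.
rng = np.random.default_rng(1)
cube = rng.normal(0.0, 1.0, size=(32, 32, 20))
cube[5, 7] += 4.0                                      # anomalous pixel
print(np.unravel_index(np.argmax(rx_detector(cube)), (32, 32)))
```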
Application of Decision on Beliefs for Fault Detection in uni-variate Statistical Process Control
In this research, the decision on belief (DOB) approach was employed to analyze and classify the states of uni-variate quality control systems. The concept of DOB and its application in decision-making problems are introduced, and then a methodology for modeling a statistical quality control problem by the DOB approach is discussed. For this iterative approach, the belief for a system being out-...
Impact of linear dimensionality reduction methods on the performance of anomaly detection algorithms in hyperspectral images
Anomaly Detection (AD) has recently become an important application of hyperspectral image analysis. The goal of these algorithms is to find the objects in the image scene which are anomalous in comparison to their surrounding background. One way to improve the performance and runtime of these algorithms is to use Dimensionality Reduction (DR) techniques. This paper evaluates the effect of thr...
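As an illustration of the pre-processing step these abstracts refer to, the following is a minimal sketch of a linear dimensionality reduction (PCA, used here purely as an assumed example) applied to a hyperspectral cube before any anomaly detector, such as the RX sketch above, is run; the cube dimensions and component count are illustrative.

```python
# Linear DR (PCA) pre-processing sketch for a hyperspectral cube (illustrative only).
import numpy as np

def pca_reduce(cube, n_components=5):
    """Project the spectral bands onto the leading principal components."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(float)
    Xc = X - X.mean(axis=0)
    # Eigen-decomposition of the band covariance; keep the top components.
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    top = eigvec[:, np.argsort(eigval)[::-1][:n_components]]
    return (Xc @ top).reshape(rows, cols, n_components)

# Usage: reduce the bands first, then run a detector on the smaller cube,
# trading a little spectral information for speed and estimation stability.
rng = np.random.default_rng(2)
cube = rng.normal(size=(32, 32, 50))
print(pca_reduce(cube, n_components=5).shape)   # (32, 32, 5)
```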